Section: New Results
Exact continuous penalties for ℓ2-ℓ0 minimization
Participants : Emmanuel Soubies, Laure Blanc-Féraud, Gilles Aubert.
We consider the following ℓ0-regularized least squares problem:

    min_{x ∈ ℝ^N}  G_ℓ0(x) := (1/2) ‖Ax − d‖² + λ ‖x‖_0,    (1)

where A ∈ ℝ^{M×N}, d ∈ ℝ^M represents the data, λ > 0 is a hyperparameter characterizing the trade-off between data fidelity and sparsity, and ‖x‖_0 counts the nonzero components of x. This problem finds a wide range of applications in signal/image processing, learning and coding areas, among many others. We proposed a unified framework for exact continuous penalties approximating the ℓ0 pseudo-norm. In other words, we are concerned with the design of a class of continuous relaxations of G_ℓ0 preserving all its global minimizers, and for which any local minimizer is also a local minimizer of the initial functional. Hence, we highlight five necessary and sufficient conditions on the continuous penalty approximating the ℓ0 pseudo-norm ensuring that the minimizers of the underlying continuous relaxation of G_ℓ0 are consistent with those of G_ℓ0. Moreover, some local minimizers of G_ℓ0 are not local minimizers of the relaxed functional, which is an interesting property for such a highly non-convex functional. This work offers a new way to compare penalties approximating the ℓ0 pseudo-norm. Finally, it is worth noting that the CEL0 penalty [1], [14], [17] is the inferior limit of the obtained class of penalties and seems to be the best choice in order to obtain an exact continuous reformulation of (1).
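As an illustration, the closed form of the separable CEL0 penalty given in [1] can be sketched numerically. The sketch below is ours, not the authors' code: `a` stands for the norm of the corresponding column of A, and the function names are hypothetical. It checks two properties that make CEL0 a continuous surrogate of the ℓ0 term: the penalty vanishes at zero and saturates at λ beyond the threshold √(2λ)/a, so it lower-bounds λ‖·‖_0 componentwise.

```python
import numpy as np

def cel0_penalty(t, lam, a):
    """CEL0 penalty phi(lam, a; t) as recalled from [1]:
    lam - (a^2/2) * (|t| - sqrt(2*lam)/a)^2  when |t| <= sqrt(2*lam)/a,
    lam                                      otherwise.
    `a` plays the role of the norm of the corresponding column of A.
    """
    t = np.asarray(t, dtype=float)
    thresh = np.sqrt(2.0 * lam) / a
    quad = lam - 0.5 * a**2 * (np.abs(t) - thresh) ** 2
    return np.where(np.abs(t) <= thresh, quad, lam)

def l0_term(t, lam):
    """Separable l0 term: lam * 1_{t != 0}."""
    return lam * (np.asarray(t, dtype=float) != 0).astype(float)
```

For instance, with λ = 1 and a = 2, the penalty is 0 at t = 0 and equals λ for |t| ≥ √2/2, and it never exceeds the ℓ0 term, consistent with CEL0 being a continuous relaxation that preserves the global minimizers of (1).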